Thread Pool

In computer programming, a thread pool is a software design pattern for achieving concurrency of execution in a computer program. Often also called a replicated workers or worker-crew model, a thread pool maintains multiple threads waiting for tasks to be allocated for concurrent execution by the supervising program. By maintaining a pool of threads, the model increases performance and avoids latency in execution caused by the frequent creation and destruction of threads for short-lived tasks. The number of available threads is tuned to the computing resources available to the program, such as the number of processor cores; tasks that arrive while every thread is busy wait in a task queue until a thread completes its current work.
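
The following is a minimal sketch of the pattern in Java: a fixed set of worker threads repeatedly takes tasks from a shared queue and runs them. The class and its names (SimpleThreadPool, submit, shutdown) are illustrative only, not part of any standard library.

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    // Minimal thread-pool sketch: a fixed set of worker threads repeatedly
    // take Runnable tasks from a shared queue and execute them.
    class SimpleThreadPool {
        private final BlockingQueue<Runnable> tasks = new LinkedBlockingQueue<>();
        private final Thread[] workers;
        private volatile boolean running = true;

        SimpleThreadPool(int size) {
            workers = new Thread[size];
            for (int i = 0; i < size; i++) {
                workers[i] = new Thread(() -> {
                    while (running) {
                        try {
                            tasks.take().run();   // block until a task is available
                        } catch (InterruptedException e) {
                            Thread.currentThread().interrupt();  // restore flag and exit
                            return;
                        }
                    }
                });
                workers[i].start();
            }
        }

        void submit(Runnable task) {
            tasks.add(task);   // enqueue work; an idle worker picks it up
        }

        void shutdown() {
            running = false;
            for (Thread w : workers) {
                w.interrupt();   // wake workers blocked on an empty queue
            }
        }
    }

In practice, most platforms provide this machinery directly, for example java.util.concurrent.ExecutorService in Java or Grand Central Dispatch on Apple systems.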


Performance

The size of a thread pool is the number of threads kept in reserve for executing tasks. It is usually a tunable parameter of the application, and choosing the optimal pool size is crucial for performance. One benefit of a thread pool over creating a new thread for each task is that thread creation and destruction overhead is restricted to the initial creation of the pool, which may result in better performance and better system stability. Creating and destroying a thread and its associated resources can be an expensive process in terms of time. An excessive number of threads in reserve, however, wastes memory, and context switching between the runnable threads incurs performance penalties. A socket connection to another network host, which might take many CPU cycles to drop and re-establish, can also be maintained more efficiently by associating it with a thread that lives over the course of more than one network transaction.

A thread pool may be useful even setting aside thread startup time. Some thread pool implementations make it trivial to queue up work, control concurrency, and synchronize threads at a higher level than is easy with manually managed threads; in these cases the performance benefits of the pool may be secondary.
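
As one concrete illustration of paying thread creation cost only once, the sketch below uses Java's standard java.util.concurrent executors. Sizing the pool to the number of available processors is an assumption for the example, not a universal rule; the key point is that the same few threads execute all of the queued tasks.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class PoolSizingExample {
        public static void main(String[] args) throws Exception {
            // Size the pool to the hardware rather than to the number of tasks:
            // thread creation cost is paid once, when the pool is built.
            int poolSize = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(poolSize);

            // Queue many short-lived tasks; the pool's threads are reused for all of them.
            List<Callable<Long>> tasks = new ArrayList<>();
            for (int i = 0; i < 1_000; i++) {
                final long n = i;
                tasks.add(() -> n * n);
            }
            List<Future<Long>> results = pool.invokeAll(tasks);   // blocks until all tasks finish

            long sum = 0;
            for (Future<Long> f : results) {
                sum += f.get();
            }
            System.out.println("sum of squares = " + sum);

            pool.shutdown();   // accept no new tasks; threads exit once idle
        }
    }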
Typically, a thread pool executes on a single computer. However, thread pools are conceptually related to server farms, in which a master process, which might itself be a thread pool, distributes tasks to worker processes on different computers in order to increase overall throughput. Embarrassingly parallel problems are highly amenable to this approach.

The number of threads may be dynamically adjusted during the lifetime of an application based on the number of waiting tasks. For example, a web server can add threads when numerous web page requests come in and can remove threads when those requests taper off. The cost of a larger thread pool is increased resource usage. The algorithm used to determine when to create or destroy threads affects overall performance (a sketch of one such policy follows the list):
* Creating too many threads wastes resources and costs time creating the unused threads.
* Destroying too many threads requires more time later when creating them again.
* Creating threads too slowly might result in poor client performance (long wait times).
* Destroying threads too slowly may starve other processes of resources.
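
A minimal sketch of such a creation and destruction policy, assuming Java's java.util.concurrent.ThreadPoolExecutor: a small core of threads is kept alive, extra threads are created under load up to a maximum, and idle extras are destroyed after a timeout. The particular bounds and timeout are illustrative only.

    import java.util.concurrent.SynchronousQueue;
    import java.util.concurrent.ThreadPoolExecutor;
    import java.util.concurrent.TimeUnit;

    public class ElasticPoolExample {
        public static void main(String[] args) throws InterruptedException {
            ThreadPoolExecutor pool = new ThreadPoolExecutor(
                    4,                          // corePoolSize: threads kept even when idle
                    64,                         // maximumPoolSize: upper bound under load
                    30, TimeUnit.SECONDS,       // idle threads beyond the core are destroyed after 30 s
                    new SynchronousQueue<>());  // hand each task straight to a thread, creating one if none is free

            // Simulate a burst of requests: the pool grows toward the maximum as
            // needed; idle threads beyond the core would be reclaimed after the timeout.
            for (int i = 0; i < 20; i++) {
                pool.execute(() ->
                        System.out.println("handled by " + Thread.currentThread().getName()));
            }

            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.MINUTES);
        }
    }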


See also

* Asynchrony (computer programming)
* Object pool pattern
* Concurrency pattern
* Grand Central Dispatch
* Parallel Extensions
* Parallelization
* Server farm
* Staged event-driven architecture


External links

* Query by Slice, Parallel Execute, and Join: A Thread Pool Pattern in Java, by Binildas C. A.
* Thread pools and work queues, by Brian Goetz
* A Method of Worker Thread Pooling, by Pradeep Kumar Sahu
* Work Queue, by Uri Twig: C++ code demonstration of pooled threads executing a work queue.
* Windows Thread Pooling and Execution Chaining
* Smart Thread Pool, by Ami Bar
* Programming the Thread Pool in the .NET Framework, by David Carmona
* by Amir Kirsh
* Practical Threaded Programming with Python: Thread Pools and Queues, by Noah Gift
* Optimizing Thread-Pool Strategies for Real-Time CORBA, by Irfan Pyarali, Marina Spivak, Douglas C. Schmidt and Ron Cytron
* Deferred cancellation. A behavioral pattern, by Philipp Bachmann
* A C++17 Thread Pool for High-Performance Scientific Computing, by Barak Shoshany